
    Forecasting Financial Time Series Using Model Averaging

    In almost all cases a decision maker cannot identify the true process ex ante. This observation has led researchers to introduce several sources of uncertainty into forecasting exercises. In this context, the research reported in these pages finds an increase in the forecasting power for financial time series when parameter uncertainty, model uncertainty and optimal decision making are taken into account. The research contained herein shows that although the implementation of these techniques is not always straightforward and depends on the exercise at hand, the predictive gains are statistically and economically significant across different applications, such as stock, bond and electricity markets.
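
    As a point of reference for the averaging schemes that recur throughout these abstracts, a minimal sketch of the standard Bayesian model averaging predictive density (the notation is illustrative and not taken from the thesis itself):

        p(y_{T+1} \mid D_T) = \sum_{i=1}^{M} p(y_{T+1} \mid M_i, D_T) \, p(M_i \mid D_T),
        p(y_{T+1} \mid M_i, D_T) = \int p(y_{T+1} \mid \theta_i, M_i, D_T) \, p(\theta_i \mid M_i, D_T) \, d\theta_i,

    where integrating over the parameter posterior p(\theta_i \mid M_i, D_T) accounts for parameter uncertainty, and weighting by the posterior model probabilities p(M_i \mid D_T) accounts for model uncertainty.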

    Predictive gains from forecast combinations using time-varying model weights

    Several frequentist and Bayesian model averaging schemes, including a new one that simultaneously allows for parameter uncertainty, model uncertainty and time-varying model weights, are compared in terms of forecast accuracy in a set of simulation experiments. Artificial data are generated that are characterized by low predictability, structural instability and fat tails, as is typical for many financial-economic time series. The sensitivity of the results with respect to misspecification of the number of included predictors and the number of included models is explored. Given the set-up of our experiments, time-varying model weight schemes outperform other averaging schemes in terms of predictive gains, both when the correlation among individual forecasts is low and when the underlying data generating process is subject to structural location shifts. In an empirical application using returns on the S&P 500 index, time-varying model weights provide improved forecasts with substantial economic gains in an investment strategy including transaction costs.
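
    The sketch below is a deliberately simplified stand-in for the time-varying weight schemes compared in the abstract: it weights models by exponentially discounted past squared errors rather than by a full Bayesian treatment, and the function and parameter names are illustrative only.

        import numpy as np

        def combine_forecasts(forecasts, actuals, delta=0.95, eps=1e-8):
            """Combine M model forecasts with time-varying, performance-based weights.

            forecasts : (T, M) array of one-step-ahead forecasts
            actuals   : (T,) array of realized values
            delta     : discount factor; lower values make the weights adapt faster
            Weights at time t only use errors observed up to time t-1.
            """
            T, M = forecasts.shape
            discounted_sse = np.zeros(M)          # discounted squared errors per model
            weights = np.full(M, 1.0 / M)         # start from equal weights
            combined = np.empty(T)
            for t in range(T):
                combined[t] = weights @ forecasts[t]
                err = actuals[t] - forecasts[t]
                discounted_sse = delta * discounted_sse + err ** 2
                inv = 1.0 / (discounted_sse + eps)   # smaller past error -> larger weight
                weights = inv / inv.sum()
            return combined

    A scheme of this kind adapts to structural location shifts because the discounting gradually forgets each model's pre-break track record.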

    Predicting the Term Structure of Interest Rates: Incorporating parameter uncertainty, model uncertainty and macroeconomic information

    We forecast the term structure of U.S. Treasury zero-coupon bond yields by analyzing a range of models that have been used in the literature. We assess the relevance of parameter uncertainty by examining the added value of using Bayesian inference compared to frequentist estimation techniques, and of model uncertainty by combining forecasts from individual models. Following the current literature, we also investigate the benefits of incorporating macroeconomic information in yield curve models. Our results show that adding macroeconomic factors is very beneficial for improving the out-of-sample forecasting performance of individual models. Despite this, the predictive accuracy of the models varies considerably over time, irrespective of whether the Bayesian or the frequentist approach is used. We show that mitigating model uncertainty by combining forecasts leads to substantial gains in forecasting performance, especially when applying Bayesian model averaging.
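
    One specification widely used in this literature, which illustrates how macroeconomic information can enter a yield curve model, is a dynamic Nelson-Siegel model augmented with macro factors (shown here as a hedged example, not as the paper's exact model set):

        y_t(\tau) = \beta_{1t} + \beta_{2t} \frac{1 - e^{-\lambda\tau}}{\lambda\tau}
                    + \beta_{3t} \left( \frac{1 - e^{-\lambda\tau}}{\lambda\tau} - e^{-\lambda\tau} \right) + \varepsilon_t(\tau),

    where y_t(\tau) is the yield at maturity \tau, and the latent level, slope and curvature factors (\beta_{1t}, \beta_{2t}, \beta_{3t}) are stacked with observed macroeconomic variables in a joint VAR transition equation, so that macro information feeds into the factor forecasts.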

    Evaluating real-time forecasts in real-time

    The accuracy of real-time forecasts of macroeconomic variables that are subject to revisions may crucially depend on the choice of data used to compare the forecasts against. We put forward a flexible time-varying parameter regression framework to obtain early estimates of the final value of macroeconomic variables based on the initial data release; these estimates may be used as actuals in current forecast evaluation. We allow for structural changes in the regression parameters to accommodate benchmark revisions and definitional changes, which fundamentally change the statistical properties of the variable of interest, including the relationship between the final value and the initial release. The usefulness of our approach is demonstrated in an empirical application comparing the accuracy of forecasts of US GDP growth rates from the Survey of Professional Forecasters and the Greenbook.
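
    A minimal sketch of the kind of time-varying parameter regression involved, assuming a random-walk law of motion for the coefficients and fixed variances (the paper's framework is more flexible; all names below are illustrative):

        import numpy as np

        def tvp_regression_filter(final, initial, q=0.01, r=1.0):
            """Kalman filter for final_t = alpha_t + beta_t * initial_t + eps_t,
            with (alpha_t, beta_t) following a random walk with innovation variance q.
            Returns filtered coefficients and one-step-ahead estimates of the final value."""
            T = len(final)
            state = np.zeros(2)              # [alpha, beta]
            P = 10.0 * np.eye(2)             # diffuse-ish initial state covariance
            Q = q * np.eye(2)
            states, preds = np.zeros((T, 2)), np.zeros(T)
            for t in range(T):
                P = P + Q                            # predict: random-walk transition
                H = np.array([1.0, initial[t]])      # observation vector
                preds[t] = H @ state                 # early estimate of the final value
                S = H @ P @ H + r
                K = P @ H / S
                state = state + K * (final[t] - preds[t])   # update with observed final value
                P = P - np.outer(K, H @ P)
                states[t] = state
            return states, preds

    In practice, the filtered prediction for the most recent release, for which no final value is available yet, is what serves as the "actual" in current forecast evaluation.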

    Combining Predictive Densities using Bayesian Filtering with Applications to US Economics Data

    Using a Bayesian framework, this paper provides a multivariate combination approach to prediction based on a distributional state space representation of the predictive densities from alternative models. In the proposed approach the model set can be incomplete. Several multivariate time-varying combination strategies are introduced. In particular, we consider weight dynamics driven by the past performance of the predictive densities, as well as the use of learning mechanisms. The approach is assessed using statistical and utility-based performance measures for evaluating density forecasts of US macroeconomic time series and of surveys of stock market prices.
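
    A stylized, hedged version of such a combination (much simplified relative to the paper's distributional state space representation; the notation is illustrative): the combined predictive density is a mixture of the individual model densities with latent, time-varying weights,

        p(y_t \mid I_{t-1}) = \sum_{i=1}^{M} w_{i,t} \, p_i(y_t \mid I_{t-1}),

    where the weight vector w_t lives on the unit simplex and follows its own state equation, with dynamics that may depend on the past forecast performance of each p_i; an incomplete model set means the true model need not be among the M candidates.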

    Bayesian Combinations of Stock Price Predictions with an Application to the Amsterdam Exchange Index

    We summarize the general combination approach of Billio et al. [2010]. In the combination model the weights follow logistic autoregressive processes, change over time, and their dynamics are possibly driven by the past forecasting performance of the predictive densities. For illustrative purposes we apply the approach to combine White Noise and GARCH models to forecast the Amsterdam Exchange index, and use the combined forecasts in an investment asset allocation exercise.
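
    One stylized way to write weights that follow logistic autoregressive processes (the symbols \phi, \gamma and the performance measure e_{i,t-1} are illustrative, not the exact specification of Billio et al. [2010]):

        w_{i,t} = \frac{\exp(x_{i,t})}{\sum_{j=1}^{M} \exp(x_{j,t})}, \qquad
        x_{i,t} = \phi \, x_{i,t-1} + \gamma \, e_{i,t-1} + \eta_{i,t},

    so that latent Gaussian autoregressive states x_{i,t} are mapped onto the unit simplex, the weights evolve over time, and a performance term e_{i,t-1} (for example, a recent log score) can drive their dynamics.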

    Bayesian Model Averaging in the Presence of Structural Breaks

    This paper develops a return forecasting methodology that allows for instability in the relationship between stock returns and predictor variables, for model uncertainty, and for parameter estimation uncertainty. The predictive regression specification that is put forward allows for occasional structural breaks of random magnitude in the regression parameters, and for uncertainty about the inclusion of forecasting variables and about the parameter values, by employing Bayesian Model Averaging. The implications of these three sources of uncertainty, and their relative importance, are investigated from an active investment management perspective. It is found that the economic value of incorporating all three sources of uncertainty is considerable. A typical investor would be willing to pay up to several hundreds of basis points annually to switch from a passive buy-and-hold strategy to an active strategy based on a return forecasting model that allows for model and parameter uncertainty as well as structural breaks in the regression parameters.
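
    A stylized representation of a predictive regression with occasional breaks of random magnitude (the notation is illustrative, not the paper's exact specification):

        r_{t+1} = x_t' \beta_t + \varepsilon_{t+1}, \qquad
        \beta_t = \beta_{t-1} + \kappa_t \, \xi_t, \quad \kappa_t \in \{0, 1\},

    where \kappa_t indicates whether a break occurs at time t and \xi_t is its random magnitude; Bayesian Model Averaging then averages the resulting forecasts over specifications that differ in which predictors enter x_t, while the parameter posteriors capture estimation uncertainty.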

    Bayesian near-boundary analysis in basic macroeconomic time series models

    Several lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic models, to forecasting with near-random-walk models, and to clustering of several economic series in a small number of groups within a data panel. Two canonical models are used: a linear regression model with autocorrelation and a simple variance components model. Several well-known time series models, like unit root and error correction models, and further state space and panel data models, are shown to be simple generalizations of these two canonical models for the purpose of posterior inference. A Bayesian model averaging procedure is presented in order to deal with models with substantial probability both near and at the boundary of the parameter region. Analytical, graphical and empirical results using U.S. macroeconomic data, in particular on GDP growth, are presented.
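
    For concreteness, canonical models of this kind typically take the following form (a hedged sketch of the two building blocks, with illustrative notation):

        regression with autocorrelated errors:  y_t = x_t' \beta + u_t, \quad u_t = \rho u_{t-1} + \varepsilon_t,
        variance components model:              y_{it} = \mu + \alpha_i + \varepsilon_{it}, \quad \alpha_i \sim N(0, \sigma_\alpha^2),

    where the near-boundary situations correspond to \rho close to one (near-instability and near-random-walk behaviour) and to \sigma_\alpha^2 close to zero (little or no clustering across the panel).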

    Interactions between Eurozone and US Booms and Busts: A Bayesian Panel Markov-switching VAR Model

    Interactions between eurozone and US booms and busts and among major eurozone economies are analyzed by introducing a panel Markov-switching VAR model well suited to a multi-country cyclical analysis. The model accommodates changes in low and high data frequencies and endogenous time-varying transition matrices of the country-specific Markov chains. The transition matrix of each Markov chain depends on its own past history and on the history of the other chains, thus allowing for modeling of the interactions between cycles. An endogenous common eurozone cycle is derived by aggregating the country-specific cycles. The model is estimated using a simulation-based Bayesian approach in which an efficient multi-move strategy algorithm is defined to draw common time-varying Markov-switching chains. Our results show that the US and eurozone cycles are not fully synchronized over the 1991-2013 sample period, with evidence of more recessions in the eurozone. Shocks affect the US one quarter in advance of the eurozone, but they spread very rapidly among economies. An increase in the number of eurozone countries in recession increases the probability of the US staying in recession, while the US recession indicator has a negative impact on the probability of staying in recession for the eurozone countries. A turning point analysis shows that the cycles of Germany, France and Italy are closer to the US cycle than those of the other countries. Belgium, Spain and Germany provide more timely information on the aggregate recession than the Netherlands and France.
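
    A hedged, stylized example of an endogenous time-varying transition probability for country i (the link function and notation are illustrative, not the paper's exact parameterization):

        P(s_{i,t} = R \mid s_{i,t-1}, s_{-i,t-1}) =
            F\!\left( a_{s_{i,t-1}} + b \, \frac{1}{N-1} \sum_{j \neq i} \mathbb{1}\{ s_{j,t-1} = R \} \right),

    where R denotes the recession regime, F is a logit- or probit-type link, and the probability of being in recession depends both on the country's own previous regime and on the share of the other countries that were in recession, which is how interactions between the country-specific cycles enter the model.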

    Parallel Sequential Monte Carlo for Efficient Density Combination: The Deco Matlab Toolbox

    This paper presents the Matlab package DeCo (Density Combination), which is based on the paper by Billio et al. (2013), where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights are time-varying and may depend on past predictive forecasting performance and other learning mechanisms. The core algorithm is the function DeCo, which applies banks of parallel Sequential Monte Carlo algorithms to filter the time-varying combination weights. The DeCo procedure has been implemented both for standard CPU computing and for Graphics Processing Unit (GPU) parallel computing. For the GPU implementation we use the Matlab Parallel Computing Toolbox and show how to use General Purpose GPU computing almost effortlessly. The GPU implementation speeds up execution by up to seventy times compared to a standard CPU Matlab implementation on a multicore machine. We illustrate the use of the package and the computational gain of the GPU version through simulation experiments and empirical applications.
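
    The sketch below illustrates the general idea of filtering latent, time-varying combination weights with a sequential Monte Carlo (particle) filter; it is not the interface of the DeCo function, and the function name, arguments and random-walk weight dynamics are assumptions made for this example (Python rather than Matlab, for brevity).

        import numpy as np

        def smc_combination_weights(density_evals, n_particles=1000, tau=0.1, seed=None):
            """Bootstrap particle filter for latent combination weights.

            density_evals : (T, M) array; entry [t, i] is model i's predictive density
                            evaluated at the realized observation y_t
            tau           : scale of the random-walk innovations of the latent log-weights
            Returns the (T, M) filtered mean combination weights."""
            rng = np.random.default_rng(seed)
            T, M = density_evals.shape
            x = rng.normal(size=(n_particles, M))                 # latent log-weights per particle
            filtered = np.zeros((T, M))
            for t in range(T):
                x = x + tau * rng.normal(size=(n_particles, M))   # propagate latent states
                w = np.exp(x - x.max(axis=1, keepdims=True))
                w /= w.sum(axis=1, keepdims=True)                 # map to the unit simplex
                like = w @ density_evals[t]                       # combined density per particle
                like /= like.sum()                                # normalized particle weights
                filtered[t] = like @ w                            # filtered mean combination weights
                idx = rng.choice(n_particles, size=n_particles, p=like)
                x = x[idx]                                        # multinomial resampling
            return filtered

    Running many such filters in parallel, one per draw from the predictive densities, is roughly what the "banks of parallel Sequential Monte Carlo algorithms" in the abstract refers to, and it is that embarrassingly parallel part which benefits most from GPU execution.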